The previous section provided the formal proofs necessary to establish complexity bounds, demonstrating that asymptotic dominance between functions holds for all sufficiently large values of $n$. We now apply this foundational understanding to recognize and differentiate the standard hierarchy of complexity classes in algorithmic analysis.

The visualization below illustrates why we say that $O(n)$ dominates $O(\log n)$ and why the growth rate of exponential functions, $O(2^n)$, makes them infeasible for large $n$.
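For a numeric view of the same comparison, here is a minimal sketch (Python, used purely for illustration; it is not part of the original material) that tabulates each growth rate as $n$ doubles:

```python
import math

# Illustrative growth-rate table: as n doubles, log2(n) barely moves,
# n and n*log2(n) grow steadily, n^2 grows faster, and 2^n explodes.
for n in [8, 16, 32, 64, 128]:
    print(f"n={n:4d}  log2(n)={math.log2(n):5.1f}  "
          f"n*log2(n)={n * math.log2(n):8.1f}  n^2={n ** 2:6d}  2^n={2 ** n:.3e}")
```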

Exercise: Identify the Dominant Term

Determine the tightest Big-O complexity $O(g(n))$ for each of the following cost functions $f(n)$ (a short numeric check follows the list):

  • $f(n) = 100 \cdot n + 5000 \cdot \log_2(n) + 12$
    The dominant term is $n$. Answer: $O(n)$
  • $f(n) = 0.5 \cdot n \cdot \log_2(n) + 3n^2$
    The dominant term is $n^2$. Answer: $O(n^2)$
  • $f(n) = 42 + 10$
    This is a constant time operation. Answer: $O(1)$
  • $f(n) = n^3 + n!$
    Factorial $n!$ grows much faster than the polynomial $n^3$, so it dominates. Answer: $O(n!)$
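As a sanity check for the first item, a small Python sketch (the helper name `f` is just illustrative) shows the ratio $f(n)/n$ settling toward the constant 100, which is exactly what $O(n)$ predicts:

```python
import math

# First exercise: f(n) = 100*n + 5000*log2(n) + 12.
# As n grows, f(n)/n approaches the constant 100, so the 100*n term dominates.
def f(n):
    return 100 * n + 5000 * math.log2(n) + 12

for k in [3, 5, 7, 9]:
    n = 10 ** k
    print(f"n = 10^{k}:  f(n)/n = {f(n) / n:.4f}")
```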

Common Complexity Classes

| Complexity | Name | Example |
|---|---|---|
| $O(1)$ | Constant | Array index lookup |
| $O(\log n)$ | Logarithmic | Binary Search |
| $O(n)$ | Linear | Iterating through a list |
| $O(n \log n)$ | Log-linear | Efficient sorting (Merge Sort) |
| $O(n^2)$ | Quadratic | Nested loops (Bubble Sort) |
| $O(2^n)$ | Exponential | Recursive Fibonacci |
| $O(n!)$ | Factorial | Traveling Salesperson (brute-force) |
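To make two rows of the table concrete, the sketches below (Python, illustrative only) show why binary search is logarithmic, since each comparison halves the remaining range, while naive recursive Fibonacci is exponential, since each call spawns two further calls:

```python
def binary_search(sorted_items, target):
    """O(log n): each comparison halves the remaining search range."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target not present

def fib(n):
    """O(2^n): each call branches into two more calls, doubling work per level."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3
print(fib(10))                                # prints 55; fib(40) is already slow
```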